
Dataphin data integration task fails with the error "java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided"


Problem description

A data integration task in Dataphin fails with the error "java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided".

The full log is as follows:

2023-02-07 09:00:13.586 [job-1380840] INFO  RetryInvocationHandler - Exception while invoking getFileInfo of class ClientNamenodeProtocolTranslatorPB over rm02.pbwear.com/172.XX.XX.124:8020 after 1 fail over attempts. Trying to fail over immediately.

java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Clock skew too great (37))]; Host Details : local host is: "mesos02.pbwear.com/172.XX.XXX.148"; destination host is: "rm02.XXr.com":8020; 
 at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client.call(Client.java:1508) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client.call(Client.java:1441) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:231) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at com.sun.proxy.$Proxy19.getFileInfo(Unknown Source) ~[na:na]
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:788) ~[hadoop-hdfs-2.6.0-cdh5.16.2.jar:na]
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_152]
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_152]
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_152]
 at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_152]
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at com.sun.proxy.$Proxy20.getFileInfo(Unknown Source) [na:na]
 at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2168) [hadoop-hdfs-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1266) [hadoop-hdfs-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1262) [hadoop-hdfs-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) [hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1262) [hadoop-hdfs-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1418) [hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at com.alibaba.datax.plugin.hadoop.BaseDfsUtil.lambda$exists$3(BaseDfsUtil.java:475) [plugin-hadoop-util-0.0.1-SNAPSHOT.jar:na]
 at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_152]
 at javax.security.auth.Subject.doAs(Subject.java:360) ~[na:1.8.0_152]
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1904) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at com.alibaba.datax.plugin.hadoop.BaseDfsUtil.exists(BaseDfsUtil.java:473) [plugin-hadoop-util-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.datax.plugin.hadoop.BaseDfsUtil.isPathExists(BaseDfsUtil.java:286) [plugin-hadoop-util-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.datax.plugin.hadoop.BaseDfsUtil.<init>(BaseDfsUtil.java:160) [plugin-hadoop-util-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.datax.plugin.writer.hdfswriter.HdfsHelper.<init>(HdfsHelper.java:54) ~[hdfswriter-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.datax.plugin.writer.hdfswriter.HdfsWriter$Job.init(HdfsWriter.java:67) ~[hdfswriter-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.trans.DlinkTransRunner.initJobWriter(DlinkTransRunner.java:77) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.doInit(DlinkTrans.java:263) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.trans.DlinkTrans.start(DlinkTrans.java:108) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.Engine.runTrans(Engine.java:89) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.Engine.entry(Engine.java:172) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
 at com.alibaba.dt.dlink.core.Engine.main(Engine.java:246) ~[dlink-engine-0.0.1-SNAPSHOT.jar:na]
Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Clock skew too great (37))]
 at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:718) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_152]
 at javax.security.auth.Subject.doAs(Subject.java:422) ~[na:1.8.0_152]
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:681) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:769) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client$Connection.access$3000(Client.java:396) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client.getConnection(Client.java:1557) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client.call(Client.java:1480) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 ... 32 common frames omitted
Caused by: javax.security.sasl.SaslException: GSS initiate failed
 at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[na:1.8.0_152]
 at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:413) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:594) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:396) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:761) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:757) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_152]
 at javax.security.auth.Subject.doAs(Subject.java:422) ~[na:1.8.0_152]
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:756) ~[hadoop-common-2.6.0-cdh5.16.2.jar:na]
 ... 35 common frames omitted
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Clock skew too great (37))
 at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:770) ~[na:1.8.0_152]
 at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[na:1.8.0_152]
 at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[na:1.8.0_152]
 at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[na:1.8.0_152]
 ... 44 common frames omitted
Caused by: sun.security.krb5.internal.KrbApErrException: Clock skew too great (37)
 at sun.security.krb5.KrbKdcRep.check(KrbKdcRep.java:89) ~[na:1.8.0_152]
 at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:87) ~[na:1.8.0_152]
 at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251) ~[na:1.8.0_152]
 at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262) ~[na:1.8.0_152]
 at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308) ~[na:1.8.0_152]
 at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126) ~[na:1.8.0_152]
 at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458) ~[na:1.8.0_152]
 at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693) ~[na:1.8.0_152]
 ... 47 common frames omitted

Cause

The clocks of the machines performing Kerberos authentication are out of sync with the customer's authentication server (KDC). The decisive line in the stack trace is "KrbApErrException: Clock skew too great (37)", where 37 is the Kerberos protocol error code KRB_AP_ERR_SKEW. By default, Kerberos rejects requests whose timestamps differ from the KDC's clock by more than 300 seconds (the clockskew setting in krb5.conf).
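The tolerance check above can be sketched as a small helper. This is an illustrative snippet, not part of Dataphin: the two epoch timestamps stand in for `date +%s` taken on the client host and on the KDC, and 300 seconds is the MIT Kerberos default clockskew.

```shell
# Sketch: decide whether two hosts' clocks differ by more than the
# Kerberos tolerance (clockskew in krb5.conf, 300 seconds by default).
# In practice, take `date +%s` on each machine (e.g. over ssh) and
# pass the two values in.
skew_exceeded() {
    local t1=$1 t2=$2 limit=${3:-300}
    local diff=$(( t1 > t2 ? t1 - t2 : t2 - t1 ))
    if [ "$diff" -gt "$limit" ]; then
        echo "skew ${diff}s exceeds ${limit}s"
    else
        echo "skew ${diff}s within ${limit}s"
    fi
}

skew_exceeded "$(date +%s)" "$(date +%s)"
```

If the reported skew exceeds the limit, every Kerberos ticket request from that host will fail exactly as in the log above, regardless of whether the keytab or principal is correct.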

Solution

Ask the deployment team to synchronize the clocks of the Dataphin and Mesos machines with the customer's Kerberos authentication server (KDC).
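The synchronization itself is typically done with NTP. The commands below are a minimal sketch, assuming root access on the Dataphin/Mesos hosts; `ntp.example.com` is a placeholder for the NTP source that the customer's KDC itself syncs against (using the same source on both sides is what keeps the clocks aligned).

```shell
# Sketch: bring a Dataphin/Mesos host's clock in line with the KDC's
# NTP source, then verify. Hostnames are placeholders.
sync_with_kdc() {
    # One-off correction (ntpdate is legacy but widely available):
    sudo ntpdate ntp.example.com
    # Or, with chrony: point /etc/chrony.conf at the same server, then
    sudo systemctl restart chronyd
    sudo chronyc makestep       # step the clock immediately instead of slewing
    # Verify the remaining offset:
    chronyc tracking
}
```

After the clocks are aligned, re-run `kinit` (or let the task re-authenticate) so a freshly timestamped ticket is used, then rerun the integration task.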

Applies to

  • Dataphin